An Information Criterion for Likelihood Selection
Authors
Abstract
For a given source distribution, we establish properties of the conditional density achieving the rate distortion function lower bound as the distortion parameter varies. In the limit as the distortion tolerated goes to zero, the conditional density achieving the rate distortion function lower bound becomes degenerate in the sense that the channel it defines becomes error-free. As the permitted distortion increases to its limit, the conditional density achieving the rate distortion function lower bound defines a channel which no longer depends on the source distribution. In addition to the data compression motivation, we establish two results, one asymptotic and one nonasymptotic, showing that the conditional densities achieving the rate distortion function lower bound make relatively weak assumptions on the dependence between the source and its representation. This corresponds, in Bayes estimation, to choosing a likelihood which makes relatively weak assumptions on the data generating mechanism if the source is regarded as a prior. Taken together, these results suggest one can use the conditional density obtained from the rate distortion function in data analysis. That is, when it is impossible to identify a “true” parametric family on the basis of physical modeling, our results provide both data compression and channel coding justification for using the conditional density achieving the rate distortion function lower bound as a likelihood.
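For concreteness, the display below recalls, in generic notation that is not taken from this paper, the standard variational definition of the rate distortion function and the exponential-tilting form of the conditional density that achieves it; here p(x) denotes the source density, d(x, y) a distortion measure, q(y | x) a candidate test channel, and λ ≥ 0 the slope parameter associated with the permitted distortion D.

% Rate distortion function for a source with density p(x), reproduction y,
% and distortion measure d(x, y): minimize mutual information subject to
% the average-distortion constraint.
\[
  R(D) \;=\; \min_{q(y \mid x)\,:\; \mathbb{E}[d(X,Y)] \le D} I(X;Y)
\]
% The minimizing conditional density has the exponential-tilting (Gibbs) form
\[
  q_{\lambda}(y \mid x) \;=\;
  \frac{q_{\lambda}(y)\, e^{-\lambda d(x,y)}}
       {\int q_{\lambda}(y')\, e^{-\lambda d(x,y')}\, dy'},
  \qquad \lambda \ge 0,
\]
% where q_lambda(y) is the output marginal induced by p(x) and q_lambda(y|x),
% and lambda is (minus) the slope of R(D) at the permitted distortion D.

In this family, λ → ∞ (distortion tolerance going to zero) concentrates q_λ(y | x) on the minimizers of d(x, ·), the degenerate, error-free limit described above, while λ = 0 (maximal permitted distortion) reduces q_λ(y | x) to the marginal q_λ(y), so the channel output no longer depends on its input.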
Similar resources
Model Selection Based on Tracking Interval Under Unified Hybrid Censored Samples
The aim of statistical modeling is to identify the model that most closely approximates the underlying process. The Akaike information criterion (AIC) is commonly used for model selection, but the precise value of AIC has no direct interpretation. In this paper we use a normalization of a difference of Akaike criteria to compare the two rival models under unified hybrid cens...
A jackknife type approach to statistical model selection
Procedures such as Akaike information criterion (AIC), Bayesian information criterion (BIC), minimum description length (MDL), and bootstrap information criterion have been developed in the statistical literature for model selection. Most of these methods use estimation of bias. This bias, which is inevitable in model selection problems, arises from estimating the distance between an unknown tr...
An Introduction to Model Selection: Tools and Algorithms
Model selection is a complicated matter in science, and psychology is no exception. In particular, the high variance in the object of study (i.e., humans) prevents the use of Popper’s falsification principle (which is the norm in other sciences). Therefore, the desirability of quantitative psychological models must be assessed by measuring the capacity of the model to fit empirical data. In the...
Using Profile Likelihood for Semiparametric Model Selection with Application to Proportional Hazards Mixed Models
We consider selection of nested and non-nested semiparametric models. Using profile likelihood we can define both a likelihood ratio statistic and an Akaike information for models with nuisance parameters. Asymptotic quadratic expansion of the log profile likelihood allows derivation of the asymptotic null distribution of the likelihood ratio statistic including the boundary cases, as well as u...
Weighted Likelihood Policy Search with Model Selection
Reinforcement learning (RL) methods based on direct policy search (DPS) have been actively discussed to achieve an efficient approach to complicated Markov decision processes (MDPs). Although they have brought much progress in practical applications of RL, there still remains an unsolved problem in DPS related to model selection for the policy. In this paper, we propose a novel DPS method, weig...
The Pennsylvania State University The Graduate School TWO TOPICS: A JACKKNIFE MAXIMUM LIKELIHOOD APPROACH TO STATISTICAL MODEL SELECTION AND A CONVEX HULL PEELING DEPTH APPROACH TO NONPARAMETRIC MASSIVE MULTIVARIATE DATA ANALYSIS WITH APPLICATIONS
This dissertation presents two topics from opposite disciplines: one is from a parametric realm and the other is based on nonparametric methods. The first topic is a jackknife maximum likelihood approach to statistical model selection and the second one is a convex hull peeling depth approach to nonparametric massive multivariate data analysis. The second topic includes simulations and applicat...
Journal title: IEEE Trans. Information Theory
Volume 45, Issue
Pages -
Publication date 1999